conditional-probability-distribution entropy
General vocabulary: entropy of a conditional probability distribution (a measure of the uncertainty of the conditional probability distribution of a discrete random variable, given that the value of another discrete random variable is specified); a worked form of this definition is sketched after the source note below.

Универсальный англо-русский словарь (Universal English-Russian Dictionary), 2011.
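
For reference (this worked form is an illustrative addition, not part of the dictionary entry), the quantity defined above is usually written as follows: for discrete random variables X and Y with conditional distribution p(y \mid x), the entropy of the conditional distribution at a fixed value x is

    H(Y \mid X = x) = -\sum_{y} p(y \mid x) \log p(y \mid x),

and averaging over x gives the conditional entropy

    H(Y \mid X) = \sum_{x} p(x)\, H(Y \mid X = x) = -\sum_{x, y} p(x, y) \log p(y \mid x).

A minimal computational sketch of the same definition (illustrative only; the function name and the example joint table below are hypothetical):

    from collections import defaultdict
    from math import log2

    def conditional_entropy(joint):
        """Conditional entropy H(Y|X) in bits, from a joint pmf given as {(x, y): probability}."""
        marginal_x = defaultdict(float)           # p(x) = sum over y of p(x, y)
        for (x, _), p in joint.items():
            marginal_x[x] += p
        h = 0.0
        for (x, y), p in joint.items():
            if p > 0:
                h -= p * log2(p / marginal_x[x])  # p(y|x) = p(x, y) / p(x)
        return h

    # Hypothetical joint distribution over (X, Y); probabilities sum to 1
    joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.40, (1, 1): 0.10}
    print(conditional_entropy(joint))             # prints H(Y|X) in bits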


See what "conditional-probability-distribution entropy" is in other dictionaries:

  • Maximum entropy probability distribution — In statistics and information theory, a maximum entropy probability distribution is a probability distribution whose entropy is at least as great as that of all other members of a specified class of distributions. According to the principle of… …   Wikipedia

  • probability theory — Math., Statistics. the theory of analyzing and making statements concerning the probability of the occurrence of uncertain events. Cf. probability (def. 4). [1830–40] * * * Branch of mathematics that deals with analysis of random events.… …   Universalium

  • Entropy (information theory) — In information theory, entropy is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information… …   Wikipedia

  • Conditional mutual information — In probability theory, and in particular, information theory, the conditional mutual information is, in its most basic form, the expected value of the mutual information of two random variables given the value of a third.… …   Wikipedia

  • Geometric distribution — A discrete probability distribution with success probability 0 < p \leq 1, support k \in \{1, 2, 3, \dots\}, pmf (1-p)^{k-1} p, cdf 1-(1-p)^k, mean \frac{1}{p}, and median \left\lceil \frac{-\log 2}{\log(1-p)} \right\rceil… …   Wikipedia

  • Conditional entropy — Individual (H(X), H(Y)), joint (H(X,Y)), and conditional entropies for a pair of correlated subsystems X, Y with mutual information I(X; Y); the standard identities are sketched after this list. In information theory, the conditional entropy (or equivocation) quantifies the remaining entropy (i.e.… …   Wikipedia

  • Noncentral chi-square distribution — A continuous probability distribution with parameters k > 0 (degrees of freedom) and \lambda > 0 (non-centrality parameter), support x \in [0, +\infty), and density \frac{1}{2} e^{-(x+\lambda)/2} \left(\frac{x}{\lambda}\right)^{k/4 - 1/2}… …   Wikipedia

  • Conditional random field — A conditional random field (CRF) is a statistical modelling method often applied in pattern recognition. More specifically it is a type of discriminative undirected probabilistic graphical model. It is used to encode known relationships between… …   Wikipedia

  • Exponential distribution — Not to be confused with the exponential families of probability distributions. [Infobox: probability density function, cumulative distribution function]… …   Wikipedia

  • Normal distribution — This article is about the univariate normal distribution. For normally distributed vectors, see Multivariate normal distribution. [Figure: probability density function, with the red line showing the standard normal distribution; cumulative distribution function]… …   Wikipedia

  • List of probability topics — This is a list of probability topics, by Wikipedia page. It overlaps with the (alphabetical) list of statistical topics. There are also the list of probabilists and list of statisticians. General aspects: Probability, Randomness, Pseudorandomness,… …   Wikipedia
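
The Conditional entropy entry above relates the individual, joint, and conditional entropies to mutual information; as an illustrative sketch (standard identities, not quoted from any of the cited dictionaries):

    H(Y \mid X) = H(X, Y) - H(X)
    I(X; Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X, Y)

where H denotes the Shannon entropy, H(X) = -\sum_{x} p(x) \log p(x), as in the Entropy (information theory) entry.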

